Nonlinear trend removal should be carefully performed in heart rate variability analysis
Background: In heart rate variability analysis, the RR (beat-to-beat interval) time series often suffer from aperiodic non-stationarity, the presence of ectopic beats, etc., which makes it hard to extract useful information from the original signals. Problem: Trend removal methods are commonly applied to reduce the influence of low-frequency and aperiodic non-stationary components in RR data. Unfortunately, this can alter the signal and make analysis of the detrended data less appropriate. Objective: Investigate the effect of detrending (linear and nonlinear) on temporal and nonlinear analysis of long-term RR data (in normal sinus rhythm, atrial fibrillation, congestive heart failure and ventricular premature arrhythmia conditions). Methods: Temporal method: the standard measure SDNN. Nonlinear methods: multi-scale Fractal Dimension (FD), Detrended Fluctuation Analysis (DFA) and Sample Entropy (SampEn). Results: Linear detrending has little effect on the global characteristics of the RR data, in either temporal analysis or nonlinear complexity analysis. After linear detrending, the SDNNs are only slightly shifted and all distributions are well preserved; the cross-scale complexity remains almost the same as, or correlated with, that of the original RR data. Nonlinear detrending changes not only the SDNN distributions but also their order among the different types of RR data: after this processing, the SDNN of normal sinus rhythm becomes indistinguishable from that of ventricular premature beats. Each type of RR data has a distinct complexity signature, yet nonlinear detrending makes all RR data similar in terms of complexity, so they can no longer be distinguished. The FD shows that nonlinearly detrended RR data have a dimension close to 2, a DFA exponent close to zero, and a SampEn larger than 1.5 -- complexity values very close to those of a random signal. Conclusions: Pre-processing by linear detrending can be performed on RR data with little influence on the subsequent analysis. Nonlinear detrending can be harmful and is not advisable as a pre-processing step. Exceptions exist, but only in combination with other appropriate techniques that avoid completely changing the signal's intrinsic dynamics.
Keywords: heart rate variability; linear/nonlinear detrending; complexity analysis; multiscale analysis; detrended fluctuation analysis; fractal dimension; sample entropy
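The temporal measure and the linear detrending step compared in this study can be sketched in a few lines. This is a minimal illustration, not the study's pipeline; the RR series below is synthetic (a drifting 800 ms baseline with noise) and purely illustrative:

```python
import numpy as np

def sdnn(rr):
    """SDNN: standard deviation of normal-to-normal intervals,
    in the units of the input RR series (here, ms)."""
    return float(np.std(rr, ddof=1))

def linear_detrend(rr):
    """Remove the least-squares straight-line trend from an RR series."""
    t = np.arange(len(rr))
    slope, intercept = np.polyfit(t, rr, 1)
    return rr - (slope * t + intercept)

# Synthetic RR series: 800 ms mean, slow linear drift, beat-to-beat noise.
rng = np.random.default_rng(0)
rr = 800 + 0.05 * np.arange(2000) + 20 * rng.standard_normal(2000)

print(sdnn(rr), sdnn(linear_detrend(rr)))
```

Consistent with the abstract's finding, removing a linear trend only shifts the SDNN by the trend's own contribution, leaving the beat-to-beat variability intact.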
A DTN routing scheme for quasi-deterministic networks with application to LEO satellites topology
We propose a novel DTN routing algorithm, called DQN, specifically designed for quasi-deterministic networks, with an application to satellite constellations. We demonstrate that our proposal efficiently forwards information over a satellite network derived from the Orbcomm topology while keeping a low replication overhead. We compare our algorithm against other well-known DTN routing schemes and show that we obtain the lowest replication ratio, without knowledge of the topology, and a delivery ratio of the same order of magnitude as that of a reference theoretical optimal routing scheme
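DQN's internal rules are not detailed in this abstract. As a hedged illustration of what routing over a quasi-deterministic (scheduled-contact) topology involves, the sketch below computes the earliest possible delivery time over a known contact plan, in the spirit of contact-graph routing; the node names and contact times are invented, and propagation/transmission delays are ignored for brevity:

```python
import heapq

# A contact: (src, dst, start, end) means the link src -> dst is up
# during [start, end). This hypothetical fragment is illustrative only.
CONTACTS = [
    ("sat1", "sat2", 0, 10),
    ("sat2", "gw",   5, 15),
    ("sat1", "sat3", 2, 8),
    ("sat3", "gw",  20, 30),
]

def earliest_delivery(contacts, src, dst, t0=0):
    """Earliest arrival time at dst for a bundle created at src at t0,
    assuming the contact plan is known (quasi-deterministic topology).
    Dijkstra-like search where 'distance' is arrival time."""
    best = {src: t0}
    heap = [(t0, src)]
    while heap:
        t, node = heapq.heappop(heap)
        if node == dst:
            return t
        if t > best.get(node, float("inf")):
            continue  # stale queue entry
        for a, b, start, end in contacts:
            if a == node and t < end:
                arrival = max(t, start)  # wait for the contact to open
                if arrival < best.get(b, float("inf")):
                    best[b] = arrival
                    heapq.heappush(heap, (arrival, b))
    return None  # unreachable within the plan

print(earliest_delivery(CONTACTS, "sat1", "gw"))  # via sat2 at t=5
```

A single-copy search like this needs the full topology; the abstract's point is that DQN approaches such performance without that knowledge, trading it for a small amount of replication.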
Classification of Cardiac Arrhythmia in vitro based on Multivariate Complexity Analysis
Background: Animal models (in vitro or in vivo) provide an excellent tool for studying heart diseases, among which arrhythmia remains one of the most active research subjects. It can be induced or treated by drugs, electrical stimulation, hypothermia, etc. Problem: However, the inducing or treating effects in cardiac cultures often appear long after the initial application, or within relatively short time windows, so signal changes must be captured and classified rapidly. Human-assisted monitoring is time-consuming and inefficient; an automatic classification method suitable for real-time use would be valuable. Methods: Since electrocardiological signals are characterized by repetitive or similar patterns reflecting intrinsic information about the patient (or culture), analyzing these patterns can help not only to monitor changes in status but also to evaluate and explore the physiologic control mechanisms. Methods based on complexity analysis are of considerable interest in this case. Aims: Compare different complexity analysis methods in order to find the most appropriate ones for discriminating normal cardiac signals from arrhythmic ones acquired from a cardiac cell culture in vitro; the selected features are then used by an SVM classifier. Results: Among the six complexity analysis methods, the Time Lagging (TLag) method yielded the best discrimination index (normal vs. arrhythmic, p-value 9e-23). The proposed Modified Hurst Exponent (HExM) performed better than the original Hurst Exponent, improving the p-value from 0.019 to 2e-9. Approximate Entropy (ApEn), Sample Entropy (SampEn) and Detrended Fluctuation Analysis gave good discrimination ratios but with larger p-values (of order 10^{-3}). Combining TLag, HExM and ApEn can provide a more robust classifier and allows the electrical activity changes in the cardiac cultures to be monitored and classified automatically
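As a hedged illustration of one of the complexity measures named above, here is a naive O(n^2) Sample Entropy (SampEn) sketch; the defaults m=2, r=0.2 and the test signals are illustrative assumptions, not the study's settings:

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B counts pairs of length-m
    templates matching within Chebyshev distance r*std(x), and A the
    same for length m+1. Self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def count(mm):
        n = len(x) - mm + 1
        templ = np.array([x[i:i + mm] for i in range(n)])
        c = 0
        for i in range(n - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += int(np.sum(d <= tol))
        return c

    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

rng = np.random.default_rng(1)
noise_en = sampen(rng.standard_normal(500))     # irregular: high entropy
sine_en = sampen(np.sin(np.arange(500) * 0.1))  # regular: low entropy
print(noise_en, sine_en)
```

Lower SampEn indicates more regular, self-similar dynamics; white noise scores high, which is why values above 1.5 in the first abstract were read as a random-like signal.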
Considering New Regularization Parameter-Choice Techniques for the Tikhonov Method to Improve the Accuracy of Electrocardiographic Imaging
The electrocardiographic imaging (ECGI) inverse problem relies heavily on adding constraints, a process called regularization, because the problem is ill-posed. When no prior information about the unknown epicardial potentials is available, the Tikhonov regularization method is the most commonly used technique. In the Tikhonov approach, the weight of the constraints is determined by the regularization parameter. However, this parameter is problem- and data-dependent: different numerical models or different clinical data may require different regularization parameters, so several parameter-choice methods, and techniques to validate them, are needed. In this work, we address this issue by showing that the Discrete Picard Condition (DPC) can guide a good regularization parameter choice for the two-norm Tikhonov method. We also study the feasibility of two techniques: the U-curve method (not yet used in the cardiac field) and a novel automatic method, called ADPC because it is based on the DPC. Both techniques were tested with simulated and experimental data, using the method of fundamental solutions as the numerical model. Their efficacy was compared with that of two techniques widely used in the literature, the L-curve and CRESO methods. The results showed the feasibility of the new techniques in the cardiac setting, with an improvement in the morphology of the reconstructed epicardial potentials and, in most cases, in their amplitude
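The two-norm Tikhonov filter and the Discrete Picard Condition can be illustrated on a toy ill-posed problem. In this sketch the random synthetic operator stands in for a real ECGI transfer matrix (an assumption of the sketch, not the paper's setup), and the solution is written through the SVD filter factors:

```python
import numpy as np

# Toy ill-conditioned forward operator A and noisy data b.
rng = np.random.default_rng(2)
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 10.0 ** np.linspace(0, -8, n)       # rapidly decaying singular values
A = U @ np.diag(s) @ V.T
x_true = V @ (s ** 0.5)                 # satisfies the Picard condition
b = A @ x_true + 1e-6 * rng.standard_normal(n)

def tikhonov(A, b, lam):
    """Two-norm Tikhonov solution via the SVD filter factors
    f_i = s_i^2 / (s_i^2 + lam^2)."""
    U, s, Vt = np.linalg.svd(A)
    f = s**2 / (s**2 + lam**2)
    return Vt.T @ (f * (U.T @ b) / s)

# Discrete Picard Condition: |u_i^T b| should decay faster than s_i
# until noise dominates; a good lam sits near that crossover.
picard_coeffs = np.abs(U.T @ b)

errs = {lam: np.linalg.norm(tikhonov(A, b, lam) - x_true)
        for lam in (1e-9, 1e-3, 1e0)}
print(errs)
```

Too small a parameter amplifies the noise in the components where |u_i^T b| has leveled off; too large a parameter filters out genuine signal. An intermediate value, which DPC-based methods such as ADPC aim to locate automatically, minimizes the reconstruction error.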
DTN routing for quasi-deterministic networks with application to LEO constellations
We propose a novel DTN routing algorithm, called DQN, specifically designed for quasi-deterministic networks, with an application to satellite constellations. We demonstrate that our proposal efficiently forwards information over a satellite network derived from the Orbcomm topology while keeping a low replication overhead. We compare our algorithm against other well-known DTN routing schemes and show that we obtain the lowest replication ratio, with a delivery ratio of the same order of magnitude as that of a reference theoretical optimal routing scheme. We also analyze the impact of terrestrial gateway density and evaluate DQN performance in heterogeneous cases
Impact of the Endocardium in a Parameter Optimization to Solve the Inverse Problem of Electrocardiography
Electrocardiographic imaging aims at reconstructing cardiac electrical events from electrical signals measured on the body surface. The most common approach relies on the inverse solution of the Laplace equation in the torso to reconstruct epicardial potential maps from body surface potential maps. Here we apply a method based on a parameter identification problem to reconstruct both activation and repolarization times. From an ansatz of the action potential, based on the Mitchell-Schaeffer ionic model, we compute body surface potential signals. The inverse problem is reduced to the identification of the parameters of the Mitchell-Schaeffer model. We investigate whether including the endocardium improves the inverse solution. We solved the parameter identification problem on two different meshes: one with only the epicardium, and one with both the epicardium and the endocardium. We compared the results on both the heart (activation and repolarization times) and the torso, using validation data of sinus rhythm and ventricular pacing. We found similar results with both meshes in 6 cases out of 7, with the presence of the endocardium slightly improving the activation times. This was most visible on a sinus beat, leading to the conclusion that inclusion of the endocardium would be useful in situations where endo-epicardial gradients in activation or repolarization times play an important role
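The Mitchell-Schaeffer ansatz mentioned above is a two-variable ionic model (a fast voltage variable and a slow gating variable). The minimal forward-Euler sketch below uses typical literature parameter values, which are an assumption of this sketch rather than the paper's calibrated values, and reads off activation and repolarization times from a threshold crossing:

```python
import numpy as np

# Mitchell-Schaeffer two-variable model; typical literature parameters.
TAU_IN, TAU_OUT = 0.3, 6.0       # fast inward / outward current (ms)
TAU_OPEN, TAU_CLOSE = 120.0, 150.0  # gate recovery / inactivation (ms)
V_GATE = 0.13                    # gating threshold (dimensionless voltage)

def mitchell_schaeffer(t_end=500.0, dt=0.05, stim_until=1.0):
    """Forward-Euler integration of one stimulated action potential.
    Returns the dimensionless voltage trace sampled every dt ms."""
    v, h = 0.0, 1.0
    vs = []
    for step in range(int(t_end / dt)):
        t = step * dt
        i_stim = 0.5 if t < stim_until else 0.0
        dv = h * v * v * (1.0 - v) / TAU_IN - v / TAU_OUT + i_stim
        dh = (1.0 - h) / TAU_OPEN if v < V_GATE else -h / TAU_CLOSE
        v += dt * dv
        h += dt * dh
        vs.append(v)
    return np.array(vs)

ap = mitchell_schaeffer()
# Activation = first upward crossing of v = 0.5; repolarization = last
# crossing, mirroring how such times are extracted from the ansatz.
above = np.nonzero(ap > 0.5)[0]
act_t, repol_t = above[0] * 0.05, above[-1] * 0.05
print(act_t, repol_t)
```

Reducing the inverse problem to identifying these few parameters per region, instead of a full potential map, is what makes the parameter-identification formulation tractable.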
A Singularity-analysis Approach to characterize Epicardial Electric Potential
The cardiac electrical activity forms a complex system, and nonlinear signal processing is required to characterize it properly. In this context, an analysis in terms of singularity exponents is shown to provide compact and meaningful descriptors of its structure and dynamics. In particular, singularity components reconstruct the epicardial electric potential maps of human atria, inverse-mapped from surface potentials; this approach describes sinus-rhythm dynamics as well as atrial flutter and atrial fibrillation. We present several example cases in which the key descriptors, in the form of fast-slow dynamics, point at the arrhythmogenic areas in the atria
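The paper's exact singularity-analysis machinery is not given in this abstract. As a minimal, hedged illustration of what a singularity (Hölder-type) exponent measures, the sketch below regresses the logarithm of a local oscillation against the logarithm of scale, on invented test signals:

```python
import numpy as np

def local_singularity_exponent(x, i, scales=(2, 4, 8, 16, 32)):
    """Crude pointwise singularity (Hölder-type) exponent at index i:
    slope of log(local oscillation) vs log(scale). Smaller exponents
    flag sharper, more transient local structure."""
    oscs = []
    for s in scales:
        window = x[max(0, i - s): i + s + 1]
        oscs.append(np.ptp(window) + 1e-12)  # max - min over the window
    return np.polyfit(np.log(scales), np.log(oscs), 1)[0]

t = np.linspace(0, 1, 2049)
smooth = t ** 2                    # smooth everywhere: exponent near 1
kink = np.abs(t - 0.5) ** 0.3      # singular at t = 0.5: exponent near 0.3

h_smooth = local_singularity_exponent(smooth, 1024)
h_kink = local_singularity_exponent(kink, 1024)
print(h_smooth, h_kink)
```

A map of such exponents over the atrial surface is the kind of compact descriptor the abstract refers to: low-exponent (sharp, fast-dynamics) regions stand out against smooth background propagation.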
Smartphone Based 3D Navigation Techniques in an Astronomical Observatory Context: Implementation and Evaluation in a Software Platform
3D Virtual Environments (3DVE) are a good way to transmit knowledge in a museum exhibit. In such contexts, providing interaction techniques that are easy to learn and use, and that facilitate handling inside a 3DVE, is crucial to maximizing knowledge transfer. We took the opportunity to design and implement a software platform for explaining the behavior of the Telescope Bernard-Lyot to museum visitors at the summit of the Pic du Midi. Beyond popularizing a complex scientific instrument, this platform constitutes an open software environment into which different 3D interaction techniques can easily be plugged. The recent popularity of the smartphone as a personal handheld computer lets us envision using a mobile device to interact with these 3DVE. Accordingly, we design and propose a way to use the smartphone as a tangible object for navigating inside a 3DVE. To demonstrate the interest of using smartphones, we compare our solution with existing solutions: keyboard-and-mouse and the 3D mouse. User experiments confirmed our hypothesis and in particular showed that visitors find our solution more attractive and stimulating. Finally, we illustrate the benefits of our software framework by plugging in alternative interaction techniques supporting selection and manipulation tasks in 3D